
KAN/H: Kolmogorov-Arnold Network using Haar-like bases

Katayama, Susumu

arXiv.org Artificial Intelligence

This paper proposes KAN/H, a variant of the Kolmogorov-Arnold Network (KAN) that replaces B-splines with a Haar-variant basis system containing both global and local bases. The resulting algorithm is applied to function approximation problems and MNIST. We show that it requires little problem-specific hyperparameter tuning.
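To make the idea of a basis with both global and local functions concrete, the following is a minimal sketch (not the paper's exact construction): a global constant function plus dilated and translated Haar wavelets, combined linearly to form a learnable 1-D edge function of the kind KAN places on each edge. The function names and the least-squares fit are illustrative assumptions.

```python
import numpy as np

def haar(x):
    # Haar mother wavelet on [0, 1): +1 on [0, 1/2), -1 on [1/2, 1)
    return np.where((x >= 0) & (x < 0.5), 1.0,
                    np.where((x >= 0.5) & (x < 1.0), -1.0, 0.0))

def haar_features(x, levels=3):
    """Evaluate a small Haar system on [0, 1): one global (constant)
    basis function plus local wavelets 2**(j/2) * haar(2**j x - k)."""
    feats = [np.ones_like(x)]            # global basis
    for j in range(levels):
        for k in range(2 ** j):
            feats.append(2 ** (j / 2) * haar(2 ** j * x - k))
    return np.stack(feats, axis=-1)      # shape (..., n_basis)

# A learnable 1-D function is a linear combination of the bases,
# here fitted by least squares as a stand-in for gradient training.
x = np.linspace(0, 1, 64, endpoint=False)
Phi = haar_features(x, levels=3)         # 1 + 1 + 2 + 4 = 8 bases
target = np.sin(2 * np.pi * x)
coeffs, *_ = np.linalg.lstsq(Phi, target, rcond=None)
approx = Phi @ coeffs
```

With dyadic sampling the discrete basis is orthonormal, so the coefficients are simply inner products with the target; the local wavelets refine the fit where the global basis alone is too coarse.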


Conformal Prediction Bands for Two-Dimensional Functional Time Series

Ajroldi, Niccolò, Diquigiovanni, Jacopo, Fontana, Matteo, Vantini, Simone

arXiv.org Machine Learning

Functional data analysis (FDA) (Ramsay and Silverman 2005) is naturally suited to representing and modeling this kind of data, as it preserves their continuous nature and provides a rigorous mathematical framework. Among others, Zhou and Pan 2014 analyzed temperature surfaces, presenting two approaches for Functional Principal Component Analysis (FPCA) of functions defined on a non-rectangular domain; Porro-Muñoz et al. 2014 focus on image processing using FDA; and a novel regularization technique for Gaussian random fields on a rectangular domain was proposed by Rakêt 2010 and applied to 2D electrophoresis images. Another bivariate smoothing approach in a penalized regression framework was introduced by Ivanescu and Andrada 2013, allowing for the estimation of functional parameters of two-dimensional functional data. As shown by Gervini 2010, even mortality rates can be interpreted as two-dimensional functional data. Whereas in all the reviewed works the functions are assumed to be realizations of i.i.d. or at least exchangeable random objects, to the best of our knowledge there is no literature focusing on forecasting time-dependent two-dimensional functional data. In this work, we focus on time series of surfaces, representing them as two-dimensional Functional Time Series (FTS).
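A conformal prediction band for function-valued data can be sketched in a few lines. The following is a generic split-conformal construction with a sup-norm nonconformity score, not the authors' exact procedure; the point predictor, the data-generating process, and the calibration size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0, 1, 100)

def sample_curve():
    # Toy functional observation: smooth signal plus pointwise noise
    return np.sin(2 * np.pi * grid) + 0.2 * rng.standard_normal(grid.size)

cal = np.stack([sample_curve() for _ in range(200)])  # calibration set
forecast = np.sin(2 * np.pi * grid)                   # point predictor

# Nonconformity score: sup-norm residual of each calibration curve
scores = np.max(np.abs(cal - forecast), axis=1)
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((1 - alpha) * (n + 1)) / n)

# Constant-width band with finite-sample marginal coverage >= 1 - alpha
lower, upper = forecast - q, forecast + q
```

The band's validity relies only on exchangeability of the curves, which is precisely the assumption that breaks down for time-dependent functional data and motivates the adaptations studied in the paper.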


Efficient Multidimensional Functional Data Analysis Using Marginal Product Basis Systems

Consagra, William, Venkataraman, Arun, Qiu, Xing

arXiv.org Machine Learning

Modern datasets, from areas such as neuroimaging and geostatistics, often come in the form of a random sample of tensor-valued data which can be understood as noisy observations of an underlying smooth multidimensional random function. Many of the traditional techniques from functional data analysis are plagued by the curse of dimensionality and quickly become intractable as the dimension of the domain increases. In this paper, we propose a framework for learning multidimensional continuous representations from a random sample of tensors that is immune to several manifestations of the curse. These representations are defined to be multiplicatively separable and adapted to the data according to an $L^{2}$ optimality criterion, analogous to a multidimensional functional principal components analysis. We show that the resulting estimation problem can be solved efficiently by the tensor decomposition of a carefully defined reduction transformation of the observed data. The incorporation of both regularization and dimensionality reduction is discussed. The advantages of the proposed method over competing methods are demonstrated in a simulation study. We conclude with a real data application in neuroimaging.
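The idea of a multiplicatively separable (marginal product) representation can be illustrated in two dimensions, where the CP tensor decomposition reduces to the SVD. The sketch below is a minimal assumed setup, not the paper's estimator: a smooth low-rank surface observed with noise is recovered as a sum of products of marginal functions.

```python
import numpy as np

rng = np.random.default_rng(0)
s, t = np.linspace(0, 1, 50), np.linspace(0, 1, 40)
S, T = np.meshgrid(s, t, indexing="ij")

# Rank-2 separable surface plus observation noise
f = np.sin(np.pi * S) * np.cos(np.pi * T) + 0.5 * S * T
Y = f + 0.05 * rng.standard_normal(f.shape)

# In 2-D, the rank decomposition is the truncated SVD: the columns
# of U and rows of Vt play the role of marginal basis functions
# u_k(s) and v_k(t), and the fit is a sum of K separable products.
U, sing, Vt = np.linalg.svd(Y, full_matrices=False)
K = 2
fhat = (U[:, :K] * sing[:K]) @ Vt[:K]

rel_err = np.linalg.norm(fhat - f) / np.linalg.norm(f)
```

Because the representation stores only marginal factors, its cost grows with the sum of the marginal grid sizes rather than their product, which is the separability property that the paper exploits in higher dimensions via tensor decompositions.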